
    Use of static surrogates in hyperparameter optimization

    Optimizing the hyperparameters and architecture of a neural network is a long yet necessary phase in the development of any new application. This time-consuming process can benefit from strategies designed to quickly discard low-quality configurations and focus on more promising candidates. This work enhances HyperNOMAD, a library that adapts a direct search derivative-free optimization algorithm to tune both the architecture and the training of a neural network simultaneously, by targeting two key steps of its execution: cheap approximations in the form of static surrogates are exploited to trigger the early stopping of the evaluation of a configuration and to rank pools of candidates. These additions are shown to reduce HyperNOMAD's resource consumption without harming the quality of the proposed solutions. Comment: http://www.optimization-online.org/DB_HTML/2021/03/8296.htm
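    The early-stopping and ranking ideas above can be illustrated with a minimal Python sketch. It does not use HyperNOMAD's actual API; `train_one_epoch` and `surrogate_loss` are hypothetical callables standing in for the expensive evaluation and the cheap static surrogate, and the margin-based discard rule is only one possible criterion.

```python
# Hypothetical sketch (not HyperNOMAD's API): use a cheap static surrogate to
# stop evaluating unpromising configurations early and to rank candidate pools.

def evaluate_with_early_stopping(config, train_one_epoch, surrogate_loss,
                                 max_epochs=50, margin=1.10):
    """Train `config` epoch by epoch; abandon it if its best validation loss
    stays above `margin` times the surrogate's cheap estimate for this point."""
    target = surrogate_loss(config)          # cheap static approximation
    best_val = float("inf")
    for epoch in range(max_epochs):
        val_loss = train_one_epoch(config)   # expensive true evaluation step
        best_val = min(best_val, val_loss)
        if epoch >= 5 and best_val > margin * target:
            return best_val, epoch + 1       # configuration discarded early
    return best_val, max_epochs              # configuration fully evaluated


def rank_candidates(pool, surrogate_loss):
    """Order a pool of candidate configurations by surrogate value so that the
    most promising ones are sent to the expensive evaluation first."""
    return sorted(pool, key=surrogate_loss)
```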

    Tight-and-Cheap Conic Relaxation for the Optimal Reactive Power Dispatch Problem

    The optimal reactive power dispatch (ORPD) problem is an alternating current optimal power flow (ACOPF) problem where discrete control devices for regulating the reactive power, such as shunt elements and tap changers, are considered. The ORPD problem is modelled as a mixed-integer nonlinear optimization problem and is therefore more complex than the ACOPF problem, which is itself highly nonconvex and generally hard to solve. Recently, convex relaxations of the ACOPF problem have attracted significant interest since they can lead to global optimality. We propose a tight conic relaxation of the ORPD problem and show that a round-off technique applied with this relaxation leads to near-global optimal solutions with very small guaranteed optimality gaps, unlike with the nonconvex continuous relaxation. We report computational results on selected MATPOWER test cases with up to 3375 buses.
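    As a rough illustration of the relax-then-round idea (a toy stand-in, not the paper's conic ORPD relaxation), the sketch below solves a small continuous relaxation with SciPy, rounds a tap-like control to the nearest admissible discrete step, re-optimizes the remaining continuous control, and bounds the optimality gap using the relaxation value. The cost function and step grid are invented for illustration.

```python
# Toy illustration of a generic relax-then-round scheme with one discrete control.
import numpy as np
from scipy.optimize import minimize

steps = np.linspace(0.9, 1.1, 21)          # admissible tap positions (hypothetical)

def cost(z):
    """Stand-in convex objective: z[0] is a continuous control, z[1] a tap setting."""
    return (z[0] - 0.3) ** 2 + 5.0 * (z[1] - 1.04) ** 2 + z[0] * z[1]

# 1) Solve the continuous relaxation, treating the tap setting as continuous.
relaxed = minimize(cost, x0=[0.0, 1.0], bounds=[(-1, 1), (0.9, 1.1)])

# 2) Round the relaxed tap value to the nearest admissible discrete step.
tap = steps[np.argmin(np.abs(steps - relaxed.x[1]))]

# 3) Re-optimize the continuous control with the tap fixed.
fixed = minimize(lambda u: cost([u[0], tap]), x0=[relaxed.x[0]], bounds=[(-1, 1)])

# The relaxation value is a lower bound, so the gap below is a guaranteed bound
# on the distance to the true mixed-integer optimum.
gap = fixed.fun - relaxed.fun
print(f"tap={tap:.2f}, cost={fixed.fun:.4f}, guaranteed gap <= {gap:.4f}")
```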

    Tuning a variational autoencoder for data accountability problem in the Mars Science Laboratory ground data system

    The Mars Curiosity rover frequently sends back engineering and science data that goes through a pipeline of systems before reaching its final destination at the mission operations center, making it prone to volume loss and data corruption. A ground data system analysis (GDSA) team is charged with monitoring this flow of information and detecting anomalies in the data in order to request a re-transmission when necessary. This work presents Δ-MADS, a derivative-free optimization method applied to tune the architecture and hyperparameters of a variational autoencoder trained to detect data with missing patches, in order to assist the GDSA team in its mission.
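    A hedged sketch of the downstream detection step: once a variational autoencoder has been tuned and trained, telemetry windows whose reconstruction error is unusually large can be flagged for the GDSA team. Here `reconstruct` is a hypothetical stand-in for the trained model's encode/decode pass, and the quantile threshold is an assumption, not the paper's method.

```python
# Flag telemetry windows whose autoencoder reconstruction error is unusually
# large, as a proxy for missing patches or corruption.
import numpy as np

def flag_anomalies(windows, reconstruct, quantile=0.99):
    """Return indices of windows whose mean-squared reconstruction error lies
    above the `quantile` level of the error distribution, plus the threshold."""
    errors = np.array([np.mean((w - reconstruct(w)) ** 2) for w in windows])
    threshold = np.quantile(errors, quantile)
    return np.where(errors > threshold)[0], threshold
```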

    Constrained stochastic blackbox optimization using a progressive barrier and probabilistic estimates

    This work introduces the StoMADS-PB algorithm for constrained stochastic blackbox optimization, an extension of the mesh adaptive direct-search (MADS) method originally developed for deterministic blackbox optimization under general constraints. The values of the objective and constraint functions are provided by a noisy blackbox, i.e., they can only be computed with random noise whose distribution is unknown. As in MADS, constraint violations are aggregated into a single constraint violation function. Since exact function values are unavailable, StoMADS-PB uses estimates and introduces so-called probabilistic bounds for the violation. Such estimates and bounds obtained from stochastic observations are required to be accurate and reliable with high but fixed probabilities. The proposed method, which allows intermediate infeasible iterates, accepts new points by using sufficient decrease conditions and imposing a threshold on the probabilistic bounds. Using Clarke nonsmooth calculus and martingale theory, Clarke stationarity convergence results for the objective and the violation function are derived with probability one.
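    The aggregation of constraint violations can be sketched as follows. The quadratic penalty h(x) = Σ_j max(c_j(x), 0)² is the usual progressive-barrier violation function for constraints c_j(x) ≤ 0; the simple averaging of repeated noisy evaluations shown here is only a naive stand-in for StoMADS-PB's estimators and probabilistic bounds.

```python
# Sketch of progressive-barrier style aggregation: constraints c_j(x) <= 0 are
# collapsed into a single violation function h(x) = sum_j max(c_j(x), 0)^2.
# With a noisy blackbox, h(x) can only be estimated (naive Monte-Carlo below).
import numpy as np

def violation(constraint_values):
    """Aggregate constraint values c_j(x) (feasible when <= 0) into h(x)."""
    c = np.asarray(constraint_values, dtype=float)
    return float(np.sum(np.maximum(c, 0.0) ** 2))

def estimate_violation(noisy_blackbox, x, n_samples=30):
    """Monte-Carlo estimate of h(x) when the blackbox returns noisy c_j(x)."""
    return float(np.mean([violation(noisy_blackbox(x)) for _ in range(n_samples)]))
```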

    Extensions to the MADS direct search algorithm for nonsmooth optimization

    Literature review of direct search methods for nonsmooth optimization -- Approach and organization of the thesis -- Nonsmooth optimization through mesh adaptive direct search and variable neighborhood search -- Parallel space decomposition of the mesh adaptive direct search algorithm -- OrthoMADS: a deterministic MADS instance with orthogonal directions

    Quantifying uncertainty with ensembles of surrogates for blackbox optimization

    This work is in the context of blackbox optimization, where the functions defining the problem are expensive to evaluate and no derivatives are available. A tried and tested technique is to build surrogates of the objective and the constraints in order to conduct the optimization at a cheaper computational cost. This work proposes different uncertainty measures for use with ensembles of surrogates. The resulting combination of an ensemble of surrogates with our measures behaves as a stochastic model and allows the use of efficient Bayesian optimization tools. The method is incorporated in the search step of the mesh adaptive direct search (MADS) algorithm to improve the exploration of the search space. Computational experiments are conducted on seven analytical problems, two multi-disciplinary optimization problems and two simulation problems. The results show that the proposed approach solves expensive simulation-based problems with greater precision and lower computational effort than stochastic models. Comment: 36 pages, 11 figures, submitted
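    A minimal sketch of how an ensemble of surrogates can act as a stochastic model: the ensemble mean serves as the prediction, the spread across members as an uncertainty measure, and both feed a standard expected-improvement acquisition. The spread-based measure and the EI formula are illustrative assumptions, not necessarily the measures defined in the paper.

```python
# Treat an ensemble of fitted surrogates as a stochastic model: mean prediction
# plus spread-based uncertainty, plugged into expected improvement (minimization).
import numpy as np
from scipy.stats import norm

def ensemble_mean_std(surrogates, x):
    """Prediction and uncertainty at x from an ensemble of fitted surrogates."""
    preds = np.array([s(x) for s in surrogates])
    return preds.mean(), preds.std()

def expected_improvement(surrogates, x, f_best, xi=0.01):
    """Standard EI acquisition built on the ensemble's mean and spread."""
    mu, sigma = ensemble_mean_std(surrogates, x)
    if sigma == 0.0:
        return 0.0
    z = (f_best - mu - xi) / sigma
    return (f_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)
```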